English word senses marked with topical category "Machine learning"
Parent categories: Artificial intelligence, Computer science, Cybernetics, Computing, Sciences, Applied mathematics, Systems theory, Technology, Mathematics, Systems, Formal sciences, Interdisciplinary fields, Society
Total: 93 word senses
- ANN (Noun) Initialism of artificial neural network.
- BP (Noun) Initialism of backpropagation.
- CNN (Noun) Initialism of convolutional neural network.
- DCNN (Noun) Initialism of deep convolutional neural network.
- DFA (Noun) Initialism of direct feedback alignment.
- DL (Noun) Initialism of deep learning.
- DRL (Noun) Initialism of deep reinforcement learning.
- GAN (Noun) Acronym of generative adversarial network.
- GPT (Noun) Initialism of generative pretrained transformer.
- Kohonen map (Noun) Synonym of self-organizing map.
- Kohonen network (Noun) Synonym of self-organizing map.
- L.L.M. (Noun) Alternative form of LLM (“large language model”).
- LLM (Noun) Initialism of large language model.
- LLM (Noun) Initialism of logic learning machine.
- LM (Noun) Initialism of language model.
- LoRA (Noun) An adapter-based technique for efficiently fine-tuning models (a minimal sketch follows this list).
- LoRA (Noun) An adapter produced by this technique; a collection of low-rank matrices added to a base model to yield a fine-tuned model.
- MLOps (Noun) A paradigm that aims to deploy and maintain machine learning models in production reliably and efficiently.
- MLP (Noun) Initialism of multilayer perceptron.
- MoE (Noun) Initialism of mixture of experts.
- NMT (Noun) Initialism of neural machine translation.
- Platt scaling (Noun) A technique for transforming the outputs of a classification model into calibrated class probabilities by fitting a logistic regression model to the classifier's scores (a minimal sketch follows this list).
- RAG (Noun) Initialism of retrieval-augmented generation: a method of improving the output of LLMs (large language models) by supplying them with a curated selection of relevant data, drawn both from prepared data collections and from on-the-fly search results.
- RL (Noun) Initialism of reinforcement learning.
- RLAIF (Noun) Initialism of reinforcement learning from AI feedback.
- RLHF (Noun) Initialism of reinforcement learning from human feedback.
- SAE (Noun) Initialism of sparse autoencoder.
- SLM (Noun) Initialism of small language model.
- SNN (Noun) Initialism of spiking neural network.
- SOFM (Noun) Initialism of self-organizing feature map.
- SOM (Noun) Initialism of self-organizing map.
- SVM (Noun) Initialism of support vector machine.
- TD (Noun) Initialism of temporal difference.
- VAE (Noun) Initialism of variational autoencoder.
- adapter (Noun) A collection of low-rank matrices which, when added to a base model, produces a fine-tuned model; a LoRA.
- artificial immune system (Noun) Any of a class of computationally intelligent, rule-based machine learning systems inspired by the principles and processes of the vertebrate immune system.
- attention (Noun) A kind of prioritisation technique in neural networks that assigns soft weights between tokens from two (or more) input sequences in order to compute the required output.
- backpropagation (Noun) An error-correction technique used in training neural networks, in which the output error is propagated backwards through the layers to compute the gradient of the loss with respect to each weight (a minimal sketch follows this list).
- concept drift (Noun) The tendency of a statistical model to become less and less accurate over time in real-world conditions.
- convolutional neural network (Noun) A neural network in which the connectivity pattern between its neurons is modelled on the organization of the animal visual cortex.
- cross-attention (Noun) A form of attention (machine learning method) in which two different input sequences are compared, i.e. the queries come from a different sequence than the keys.
- data poisoning (Noun) The deliberate use of a training dataset with data designed to increase errors in the output of a machine learning model.
- derain (Verb) To remove rain from an image.
- deraining (Noun) The removal of rain from an image or video.
- embedding (Noun) A representation of a unit of text (such as a word or token) as a vector, which encodes the context in which it is used.
- ensemble (Noun) A supervised learning algorithm combining multiple hypotheses.
- epoch (Noun) One complete presentation of the training data set to an iterative machine learning algorithm.
- feature (Noun) An individual measurable property or characteristic of a phenomenon being observed; the input of a model.
- feedforward neural network (Noun) A neural network whose connections do not form cycles, so that outputs are never fed back as input to a previous layer.
- few-shot (Adjective) Of or relating to a machine learning paradigm in which a model is trained on a very small amount of data, typically much less than that required by traditional machine learning approaches.
- few-shot learning (Noun) An object categorization problem, mainly in computer vision, involving the classification of objects based on only a few examples.
- generative adversarial network (Noun) A system used in machine learning, consisting of two neural networks, one of which generates candidate solutions to a problem while the other evaluates and accepts or rejects them.
- graduate student descent (Noun) The process of choosing hyperparameters manually and in an ad-hoc manner, typical of work assigned to a graduate student.
- ground (Verb) To complement a machine learning model with relevant information it was not trained on.
- hidden layer (Noun) The layer or layers of neurons between the input layer and the output layer in a neural network.
- hyperparameter (Noun) A parameter whose value is set before the learning process begins.
- image matting (Noun) Extraction of an object in the foreground of an image from its background.
- input neuron (Noun) A neuron in the input layer of a neural network, outputting only a constant.
- language model (Noun) A machine learning model that assigns probabilities to sequences of characters or words, and/or is capable of generating plausible subsequent text from a given prompt.
- large language model (Noun) A type of neural network specialized in language, typically including billions of parameters.
- latent space (Noun) A multidimensional spatial representation of latent variables, where related concepts are positioned closer to one another.
- minibatch (Noun) A subset of the training data, smaller than one full epoch, that is processed in a single training step (a minimal sketch follows this list).
- model collapse (Noun) Degradation of output quality caused by training on synthetic data.
- multi-armed bandit (Noun) An algorithm that allocates a fixed limited set of resources between competing alternative choices so as to maximize the expected gain, when each choice's properties are only partially known at the time of allocation, and may become better understood as time passes or allocations are made.
- multilayer perceptron (Noun) A neural network having at least one hidden layer, and whose neurons use a nonlinear activation function (e.g. sigmoid).
- neural network (Noun) A real or virtual computer system designed to emulate the brain in its ability to "learn" to assess imprecise data.
- neural processing unit (Noun) A specialized hardware circuit designed to accelerate machine learning applications.
- one-shot learning (Noun) An object categorization problem, mainly in computer vision, involving the classification of objects based on one, or only a few, examples.
- online learning (Noun) The training of a neural network using stochastic gradient descent with a mini-batch size of one; only one input is used for training at a time.
- overfitting (Noun) The production of a statistical model that, due to an excess of parameters relative to the sample size, fits the sample data extremely well but fails to fit new data.
- parameter (Noun) A variable that describes a property or characteristic of some system (material, object, event, etc.) or some aspect thereof; specifically, a variable of a model that is trained by a machine learning algorithm.
- poisoning attack (Noun) Synonym of data poisoning.
- pretrain (Verb) To train (a neural network) on some data set (typically a large generic data set, the output of which one is not directly interested in) before fine-tuning the network on another dataset one is interested in.
- pretrained (Adjective) Trained on a (usually large) dataset by someone else.
- prompt (Noun) Textual input given to a large language model or image model in order to have it generate a desired output.
- prompt (Verb) To provide textual input in the form of ordinary language to (an artificial intelligence or language model) to have it generate a desired output.
- proompt (Noun) A prompt.
- proompt (Verb) To prompt; to write a prompt for an AI model.
- random neural network (Noun) A mathematical representation of an interconnected network of neurons or cells which exchange spiking signals.
- recall (Noun) The fraction of all relevant items (true positives) that are returned by a search or correctly identified by a classifier (a minimal sketch follows this list).
- recurrent neural network (Noun) An artificial neural network where the connections between units give them a kind of internal memory useful for processing arbitrary sequences of inputs as in speech recognition.
- reinforcement learning (Noun) A type of explorative learning via feedback in a simulated environment.
- self-attention (Noun) A form of attention (machine learning method) in which the input sequence is compared with itself, i.e. the keys and queries come from the same sequence (a minimal sketch follows this list).
- self-organizing map (Noun) A kind of artificial neural network trained using competitive learning rather than error correction, and producing a low-dimensional (typically two-dimensional) representation of a higher-dimensional data set while preserving the topological structure of the data.
- self-play (Noun) A technique for improving the performance of reinforcement learning agents by having them play against themselves.
- stochastic parrot (Noun) A language model that uses artificial intelligence to generate seemingly coherent or contextually relevant text but does not truly understand the meaning of the language it is processing.
- support vector machine (Noun) A supervised learning model with associated learning algorithms that analyse data for classification and regression analysis.
- temperature (Noun) A sampling parameter that controls the degree of randomness of a generative model's output (a minimal sketch follows this list).
- train (Verb) To feed data into an algorithm, usually based on a neural network, to create a machine learning model that can perform some task.
- transformer (Noun) A neural network architecture composed of layers of attention which takes sequences of tokens (representing text, images, audio, or other modalities) as input.
- tree kernel (Noun) A kernel function over tree data structures, based on the more general concept of the positive-definite kernel, used in the parsing and classification of sentences.
- zero-shot (Adjective) Relating to zero-shot learning; performing without specialized training data.
- zero-shot learning (Noun) A problem setup in machine learning, where, at test time, a learner observes samples from classes that were not observed during training and attempts to predict the class that they belong to, typically based on auxiliary information.
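
The sketches below illustrate a few of the senses above. Each is a minimal NumPy example built on stated assumptions, not an authoritative implementation.

This first sketch illustrates the LoRA and adapter senses: a frozen base weight matrix plus a trainable low-rank update B @ A. The dimensions, rank, and initialisation scale are illustrative assumptions.

```python
# Minimal sketch of the low-rank-adapter idea behind LoRA: keep the frozen base
# weight W fixed and learn only a small update B @ A of rank r << min(d_out, d_in).
# Dimensions and scaling are assumptions chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 64, 4

W = rng.normal(size=(d_out, d_in))       # frozen base-model weight matrix
A = rng.normal(size=(r, d_in)) * 0.01    # trainable low-rank factor
B = np.zeros((d_out, r))                 # trainable low-rank factor (starts at zero)

def adapted_forward(x):
    """Base layer plus the low-rank update; only A and B would be trained."""
    return x @ W.T + x @ (B @ A).T

x = rng.normal(size=(2, d_in))
print(adapted_forward(x).shape)          # (2, 64)
# The adapter adds only r * (d_in + d_out) = 512 trainable values,
# versus the d_out * d_in = 4096 values of a full fine-tune.
```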
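
A sketch of the Platt scaling sense: a logistic model p = sigmoid(a*s + b) fitted by plain gradient descent to map raw classifier scores to calibrated probabilities. The scores and labels are made-up held-out data.

```python
# Minimal sketch of Platt scaling on synthetic held-out scores and labels.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.normal(size=200)                                   # raw decision scores
labels = (scores + rng.normal(scale=1.5, size=200) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

a, b, lr = 1.0, 0.0, 0.1
for _ in range(2000):                     # gradient descent on the log loss
    p = sigmoid(a * scores + b)
    grad_a = np.mean((p - labels) * scores)
    grad_b = np.mean(p - labels)
    a, b = a - lr * grad_a, b - lr * grad_b

print(f"fitted a={a:.2f}, b={b:.2f}")
print(sigmoid(a * np.array([-2.0, 0.0, 2.0]) + b).round(2))     # calibrated probabilities
```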
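
A sketch of the backpropagation sense: a tiny two-layer network on toy XOR data, where the output error is propagated backwards to obtain weight gradients. The architecture, learning rate, and step count are illustrative assumptions.

```python
# Minimal sketch of backpropagation on a tiny two-layer sigmoid network (toy XOR data).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, W2 = rng.normal(size=(2, 8)), rng.normal(size=(8, 1))
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1)                  # hidden activations
    out = sigmoid(h @ W2)                # network output
    # backward pass: propagate the output error back through each layer
    d_out = (out - y) * out * (1 - out)  # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)   # error signal at the hidden layer
    W2 -= lr * (h.T @ d_out)             # gradient step for second-layer weights
    W1 -= lr * (X.T @ d_h)               # gradient step for first-layer weights

print(out.round(2))  # predictions after training; typically close to the XOR targets
```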
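
A sketch of the epoch and minibatch senses: each epoch presents the whole training set once, split into minibatches. The data are synthetic, and `update_model` is a hypothetical placeholder for one gradient step.

```python
# Minimal sketch of epochs and minibatches over a synthetic training set.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))          # toy features
y = rng.integers(0, 2, size=1000)        # toy labels
batch_size, num_epochs = 32, 3

def update_model(x_batch, y_batch):
    pass                                 # stand-in for a real gradient update

for epoch in range(num_epochs):
    order = rng.permutation(len(X))      # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        update_model(X[idx], y[idx])     # one minibatch = one update step
    print(f"epoch {epoch + 1} complete: whole training set seen once")
```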
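
A sketch of the recall sense as the fraction of relevant (positive) items that are actually returned, recall = TP / (TP + FN), on made-up labels.

```python
# Minimal sketch: recall = TP / (TP + FN) on made-up binary labels.
import numpy as np

y_true = np.array([1, 1, 1, 0, 0, 1, 0, 1])
y_pred = np.array([1, 0, 1, 0, 1, 1, 0, 0])

tp = np.sum((y_pred == 1) & (y_true == 1))   # relevant items that were returned
fn = np.sum((y_pred == 0) & (y_true == 1))   # relevant items that were missed
print(tp / (tp + fn))                        # 3 / (3 + 2) = 0.6
```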
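
A sketch of the attention and self-attention senses: single-head scaled dot-product attention over a toy sequence, with queries, keys, and values all projected from the same sequence. Sequence length, dimensions, and random projections are illustrative assumptions.

```python
# Minimal sketch of single-head scaled dot-product self-attention.
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))          # token embeddings (toy data)

W_q = rng.normal(size=(d_model, d_model))        # query projection
W_k = rng.normal(size=(d_model, d_model))        # key projection
W_v = rng.normal(size=(d_model, d_model))        # value projection

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# In self-attention, queries, keys, and values all come from the same sequence x;
# in cross-attention, the keys and values would come from a second sequence.
q, k, v = x @ W_q, x @ W_k, x @ W_v
weights = softmax(q @ k.T / np.sqrt(d_model))    # soft weights between tokens
output = weights @ v                             # weighted mixture of values
print(weights.round(2))
print(output.shape)                              # (4, 8)
```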
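
A sketch of the temperature sense: dividing a model's output logits by the temperature before the softmax, so low temperatures make sampling near-greedy and high temperatures make it more random. The logits are made-up numbers standing in for a real model's output.

```python
# Minimal sketch of temperature-scaled sampling from made-up output logits.
import numpy as np

def sample_with_temperature(logits, temperature, rng=np.random.default_rng(0)):
    """Divide logits by the temperature, softmax, then sample a token index."""
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())        # subtract max for numerical stability
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

logits = [2.0, 1.0, 0.2]
print(sample_with_temperature(logits, 0.1))      # low temperature: near-greedy, index 0 dominates
print(sample_with_temperature(logits, 2.0))      # high temperature: flatter distribution, more random
```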
This page is a part of the kaikki.org machine-readable English dictionary. This dictionary is based on structured data extracted on 2024-12-15 from the enwiktionary dump dated 2024-12-04 using wiktextract (8a39820 and 4401a4c).
The data shown on this site has been post-processed: various details (e.g., extra categories) have been removed, some information has been disambiguated, and additional data has been merged from other sources. See the raw data download page for the unprocessed wiktextract data.
If you use this data in academic research, please cite Tatu Ylonen: Wiktextract: Wiktionary as Machine-Readable Structured Data, Proceedings of the 13th Conference on Language Resources and Evaluation (LREC), pp. 1317-1325, Marseille, 20-25 June 2022. Linking to the relevant page(s) under https://kaikki.org would also be greatly appreciated.